Jayadev Acharya
Entropy is a measure of randomness with many applications. We will discuss the problems of estimating the Shannon entropy and Rényi entropies of discrete distributions from samples. We will provide an overview of Shannon entropy estimation, and then treat the problem of estimating Rényi entropy in greater detail. We will discuss some results that surprised the authors and that will, we hope, also interest the audience.
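For concreteness, the quantities in question are the standard ones: for a discrete distribution $p$,
$$
H(p) = -\sum_x p(x)\log p(x), \qquad H_\alpha(p) = \frac{1}{1-\alpha}\log \sum_x p(x)^\alpha \quad (\alpha \ge 0,\ \alpha \ne 1),
$$
with Shannon entropy recovered as the limit of the Rényi entropy $H_\alpha$ as $\alpha \to 1$.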
This is joint work with Alon Orlitsky, Ananda Theertha Suresh, and Himanshu Tyagi.